- Path: news.uni-c.dk!inet!perjac
- From: perjac@inet.uni-c.dk (Per Jacobsen)
- Newsgroups: comp.sys.amiga.misc
- Subject: Re: WWW-page
- Date: 3 Feb 1996 05:49:29 GMT
- Organization: The Shadows' Parking lot
- Message-ID: <4eut19$h0i@news.uni-c.dk>
- References: <846.6602T1051T1356@idefix.wu-wien.ac.at>
- NNTP-Posting-Host: inet.uni-c.dk
- X-Newsreader: TIN [version 1.2 PL2]
-
- Clemens Resanka (h9525967@idefix.wu-wien.ac.at) wrote:
-
- > Is there a way to automatically retrieve www-pages?
-
- Yes, and it's very simple too. You need the program GetURL-1.03.lha from
- Aminet.
-
- You need AmiTCP and that program and nothing more, i.e., no browsers or
- anything else.
-
- Give it a URL and it fetches that page. You can also ask it to do it
- recursively (an Amiga robot! :-); it will then follow all links and fetch
- all the pages, graphics, etc.
-
- If you don't specify some limits (domain names, nesting levels, URL
- patterns), it will try to download every page in the world :)
-
- Very neat little program.
-
-